Learning geometric combinations of Gaussian kernels with alternating Quasi-Newton algorithm
Authors
Abstract
We propose a novel algorithm for learning a geometric combination of Gaussian kernels jointly with an SVM classifier. This problem is the product counterpart of multiple kernel learning (MKL), restricted to Gaussian kernels. Our algorithm finds a local solution by alternating a Quasi-Newton gradient descent over the kernels and a classical SVM solver over the instances. We show promising results on well-known data sets which suggest the soundness of the approach.
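To make the alternating scheme concrete, below is a minimal Python sketch of one plausible implementation, not the paper's actual method. It assumes binary labels, uses scikit-learn's SVC with a precomputed kernel for the SVM step and SciPy's L-BFGS-B for the Quasi-Newton step over nonnegative per-feature weights, and substitutes a simple l1 penalty (lam) for whatever normalization the paper imposes on the weights; all names (fit_geometric_mkl, sq_diffs, lam) are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVC

def fit_geometric_mkl(X, y, n_outer=10, C=1.0, lam=0.1):
    """Alternate an SVM solve with a quasi-Newton step on kernel weights.

    The kernel is a geometric (product) combination of per-feature
    Gaussian kernels: K(x, x') = prod_m exp(-d_m (x_m - x'_m)^2)
                               = exp(-sum_m d_m (x_m - x'_m)^2).
    Sketch under stated assumptions, not the paper's exact algorithm.
    """
    n, m = X.shape
    d = np.full(m, 1.0 / m)                           # initial kernel weights
    sq_diffs = (X[:, None, :] - X[None, :, :]) ** 2   # (n, n, m) pairwise cache
    svm = SVC(C=C, kernel="precomputed")

    for _ in range(n_outer):
        # --- SVM step: solve the dual with the kernel weights held fixed ---
        K = np.exp(-np.tensordot(sq_diffs, d, axes=([2], [0])))
        svm.fit(K, y)
        beta = np.zeros(n)                            # signed dual coefs y_i * alpha_i
        beta[svm.support_] = svm.dual_coef_.ravel()

        # --- Quasi-Newton step: descend the dual objective w.r.t. d ---
        def objective(d_new):
            Kd = np.exp(-np.tensordot(sq_diffs, d_new, axes=([2], [0])))
            f = -0.5 * beta @ Kd @ beta + lam * d_new.sum()
            # dK/dd_j = -sq_diffs[:, :, j] * K, hence the sign flip below
            g = np.array([0.5 * beta @ (sq_diffs[:, :, j] * Kd) @ beta + lam
                          for j in range(m)])
            return f, g

        res = minimize(objective, d, jac=True, method="L-BFGS-B",
                       bounds=[(0.0, None)] * m,      # keep each d_m >= 0
                       options={"maxiter": 5})        # a few inner steps per round
        d = res.x
    return svm, d

# Toy usage on synthetic two-class data.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)
model, weights = fit_geometric_mkl(X, y)
print("learned kernel weights:", weights)
```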
Similar resources
Blind Deconvolution Using the Relative Newton Method
We propose a relative optimization framework for quasi-maximum-likelihood blind deconvolution, with the relative Newton method as a particular instance. The special structure of the Hessian allows its fast approximate construction and inversion with complexity comparable to that of gradient methods. The use of rational IIR restoration kernels provides a richer family of filters than the traditionally used...
Full text
On the convergence speed of artificial neural networks in the solving of linear systems
Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper scrutinizes the effect of diverse learning methods on the speed of convergence of neural networks. To this end, we first introduce a perceptron method based on artificial neural networks, which has been applied to solving a non-singula...
Full text
Online Learning with (Multiple) Kernels: A Review
This review examines kernel methods for online learning, in particular multiclass classification. We examine margin-based approaches, stemming from Rosenblatt's original perceptron algorithm, as well as nonparametric probabilistic approaches that are based on the popular Gaussian process framework. We also examine approaches to online learning that use combinations of kernels--online multiple ...
Full text
Best multilinear rank approximation of tensors with quasi-Newton methods on Grassmannians
In this report we present computational methods for the best multilinear rank approximation problem. We consider algorithms built on quasi-Newton methods operating on products of Grassmann manifolds. Specifically, we test and compare methods based on BFGS and L-BFGS updates in local and global coordinates with the Newton-Grassmann and alternating least squares methods. The performance of the quas...
Full text
Fast Rates for Support Vector Machines using Gaussian Kernels
We establish learning rates up to the order of n^{-1} for support vector machines with hinge loss (L1-SVMs) and nontrivial distributions. For the stochastic analysis of these algorithms we use recently developed concepts such as Tsybakov's noise assumption and local Rademacher averages. Furthermore, we introduce a new geometric noise condition for distributions that is used to bound the approximati...
Full text